
    Evolution, complexity and artificial life

    Evolution and complexity characterize both biological and artificial life, the latter through the direct modeling of biological processes and the creation of populations of interacting entities from which complex behaviors can emerge and evolve. This edited book includes invited chapters from leading scientists in the fields of artificial life, complex systems, and evolutionary computing. The contributions identify both fundamental theoretical issues and state-of-the-art real-world applications. The book is intended for researchers and graduate students in the related domains.

    Toward Collective Self-Awareness and Self-Expression in Distributed Systems

    Simultaneously applying hierarchy and recursion enables self-awareness and self-expression in distributed systems, providing greater efficiency and scalability in tasks such as network exploration and message routing.

    High-Performance Computing and ABMS for High-Resolution COVID-19 Spreading Simulation

    This paper presents an approach to modeling and simulating the spreading of COVID-19 based on agent-based modeling and simulation (ABMS). Our goal is not only to support large-scale simulations but also to increase the simulation resolution. Moreover, we do not assume an underlying contact network: the person-to-person contacts responsible for the spreading are modeled as a function of the geographical distance between individuals. In particular, we defined a commuting mechanism combining radiation-based and gravity-based models, and we exploited the commuting properties at different resolution levels (municipalities and provinces). Finally, we used high-performance computing (HPC) facilities to simulate millions of concurrent agents, each mapping an individual's behavior. To run these simulations, we developed a spreading simulator and validated it by simulating the spreading in two of the most populated Italian regions: Lombardy and Emilia-Romagna. Our main achievement is the effective modeling of 10 million concurrent agents, each mapping an individual's behavior at high resolution in terms of social contacts, mobility, and contribution to the virus spreading. Moreover, we analyzed the ability of our framework to forecast the number of infections when initialized with only a few days of real data. We validated our model against statistical data from the serological survey conducted in Lombardy; our model makes a smaller error than other state-of-the-art models, with a final root mean squared error (RMSE) of 56,009 when simulating the entire first pandemic wave in spring 2020. For the Emilia-Romagna region, we simulated the second pandemic wave during autumn 2020 and reached a final RMSE of 10,730.11.
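
    The abstract does not give the exact formulation of the combined commuting mechanism, but the two model families it names are standard in mobility modeling. Below is a minimal Python sketch of both, assuming hypothetical inputs `pop` (population per location) and `dist` (pairwise distance matrix); how the paper actually blends the two models is not specified here.

```python
import numpy as np

def gravity_flows(pop, dist, beta=2.0):
    """Gravity model: flow from i to j scales with pop[i] * pop[j] / dist[i, j]**beta."""
    flows = np.outer(pop, pop) / np.maximum(dist, 1e-9) ** beta
    np.fill_diagonal(flows, 0.0)
    return flows / flows.sum(axis=1, keepdims=True)  # rows become commuting probabilities

def radiation_flows(pop, dist):
    """Radiation model (Simini et al., 2012): commuting from i to j depends on s,
    the total population living closer to i than j does (excluding i and j)."""
    n = len(pop)
    flows = np.zeros((n, n))
    for i in range(n):
        s = 0.0
        for j in np.argsort(dist[i]):  # visit destinations from nearest to farthest
            if j == i:
                continue
            flows[i, j] = pop[i] * pop[j] / ((pop[i] + s) * (pop[i] + pop[j] + s))
            s += pop[j]
    return flows / flows.sum(axis=1, keepdims=True)
```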

    ReSS: A tool for discovering relevant sets in complex systems

    A complex system can be composed of inherent dynamical structures, i.e., relevant subsets of variables interacting tightly with one another and loosely with other subsets. In the literature, some effective methods to identify such relevant sets rely on the so-called Relevance Indexes (RIs), which measure subset relevance based on information-theoretic principles. In this paper, we present ReSS, a collection of CUDA-based programs computing two such RIs, either through an exhaustive search or, when the system dimension is too large, through a niching metaheuristic. ReSS also includes a script that iteratively activates the search and identifies hierarchical relationships among the relevant subsets. The main purpose of ReSS is to establish a common and easy-to-use general RI-based platform for the analysis of complex systems and other possible applications.
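
    The specific Relevance Indexes implemented in ReSS, and their normalisation against a homogeneous null model, are not detailed in the abstract. As a rough illustration of the underlying idea, here is a sketch of a cluster-index-style RI computed from discrete observation data; all function and variable names are hypothetical, and this is not ReSS's actual computation.

```python
import numpy as np
from collections import Counter

def entropy(samples):
    """Empirical Shannon entropy (bits) of a sequence of discrete samples."""
    counts = np.array(list(Counter(samples).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def relevance_index(data, subset):
    """Cluster-index-style RI of a variable subset S (must be a proper subset):
      RI(S) = integration(S) / MI(S; rest), where
      integration(S) = sum_i H(X_i) - H(S)     (internal cohesion of S)
      MI(S; rest)    = H(S) + H(rest) - H(all) (coupling of S with the rest)
    data: (n_samples, n_vars) array of discrete observations."""
    all_vars = list(range(data.shape[1]))
    rest = [i for i in all_vars if i not in subset]
    joint = lambda cols: [tuple(row) for row in data[:, cols]]
    h_s, h_rest, h_all = (entropy(joint(c)) for c in (list(subset), rest, all_vars))
    integration = sum(entropy(joint([i])) for i in subset) - h_s
    mi = h_s + h_rest - h_all
    return integration / mi if mi > 0 else float("inf")  # decoupled subset: maximally relevant
```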

    Automatic evolutionary medical image segmentation using deformable models

    This paper describes a hybrid level set approach to medical image segmentation. The method combines region- and edge-based information with prior shape knowledge introduced via deformable registration. A parameter tuning mechanism, based on Genetic Algorithms (GAs), provides the ability to automatically adapt the level set to different segmentation tasks. Provided with a set of examples, the GA learns the correct weights for each image feature used in the segmentation. The algorithm has been tested on four different medical datasets across three image modalities. Our approach has shown significantly more accurate results than six state-of-the-art segmentation methods. The contributions of both the image registration and the parameter learning steps to the overall performance of the method have also been analyzed.
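
    The abstract describes GA-based tuning of per-feature level-set weights against segmentation examples. Below is a minimal sketch of that outer loop, assuming a hypothetical `segment(image, weights)` routine standing in for the hybrid level set and Dice overlap as the fitness; the paper's actual GA encoding and operators may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def dice(pred, truth):
    """Dice overlap between two binary masks (segmentation accuracy in [0, 1])."""
    inter = np.logical_and(pred, truth).sum()
    return 2.0 * inter / (pred.sum() + truth.sum() + 1e-9)

def fitness(weights, examples, segment):
    """Mean Dice over (image, ground_truth) pairs when the level set runs
    with the given per-feature weights."""
    return np.mean([dice(segment(img, weights), gt) for img, gt in examples])

def ga_tune(examples, segment, n_weights=3, pop_size=20, gens=50):
    """Tiny real-coded GA: fitness-proportional selection, blend crossover,
    Gaussian mutation, with elitism."""
    pop = rng.random((pop_size, n_weights))
    for _ in range(gens):
        scores = np.array([fitness(ind, examples, segment) for ind in pop])
        new = [pop[scores.argmax()].copy()]  # keep the best individual
        while len(new) < pop_size:
            a, b = pop[rng.choice(pop_size, 2, p=scores / scores.sum())]
            child = 0.5 * (a + b) + rng.normal(0.0, 0.05, n_weights)
            new.append(np.clip(child, 0.0, 1.0))
        pop = np.array(new)
    return pop[np.argmax([fitness(ind, examples, segment) for ind in pop])]
```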

    A Survey on Evolutionary Computation for Computer Vision and Image Analysis: Past, Present, and Future Trends

    Computer vision (CV) is a broad and important field in artificial intelligence covering a wide range of applications. Image analysis is a major task in CV, aiming to extract, analyse, and understand the visual content of images. However, image-related tasks are very challenging due to many factors, e.g., high variations across images, high dimensionality, domain expertise requirements, and image distortions. Evolutionary computation (EC) approaches have been widely used for image analysis with significant achievements. However, there is no comprehensive survey of existing EC approaches to image analysis. To fill this gap, this paper provides a comprehensive survey covering all essential EC approaches to important image analysis tasks, including edge detection, image segmentation, image feature analysis, image classification, object detection, and others. This survey aims to provide a better understanding of evolutionary computer vision (ECV) by discussing the contributions of different approaches and exploring how and why EC is used for CV and image analysis. The applications, challenges, issues, and trends associated with this research field are also discussed and summarised to provide further guidelines and opportunities for future research.

    Lazy Network: A Word Embedding-Based Temporal Financial Network to Avoid Economic Shocks in Asset Pricing Models

    Public companies in the US stock market must annually report their activities and financial performance to the SEC by filing the so-called 10-K form. Recent studies have demonstrated that changes in the textual content of the corporate annual filing (10-K) can convey strong signals of companies' future returns. In this study, we combine natural language processing techniques and network science to introduce a novel 10-K-based network, named the Lazy Network, that leverages year-on-year changes in companies' 10-Ks detected using a neural network embedding model. The Lazy Network aims to capture textual changes derived from financial or economic changes on the equity market. Leveraging the Lazy Network, we present a novel investment strategy that attempts to select the least disrupted and most stable companies by capturing the peripheries of the Lazy Network. We show that this strategy earns statistically significant risk-adjusted excess returns. Specifically, the proposed portfolios yield up to 95 basis points in monthly five-factor alphas (over 12% annually), outperforming similar strategies in the literature.
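
    The abstract leaves the network construction unspecified; the sketch below is one speculative reading of the pipeline shape only (embed consecutive 10-Ks, link companies by similarity of their year-on-year change vectors, rank candidates by peripherality). The edge rule, threshold, and all names are assumptions, not the paper's method.

```python
import numpy as np
import networkx as nx

def lazy_network_candidates(embeddings, threshold=0.9):
    """Hypothetical reconstruction of the pipeline shape.
    embeddings: {ticker: (emb_prev_year, emb_this_year)} document vectors."""
    tickers = list(embeddings)
    # Year-on-year change vector per company, from consecutive 10-K embeddings.
    deltas = {t: embeddings[t][1] - embeddings[t][0] for t in tickers}
    g = nx.Graph()
    g.add_nodes_from(tickers)
    for i, a in enumerate(tickers):
        for b in tickers[i + 1:]:
            da, db = deltas[a], deltas[b]
            sim = da @ db / (np.linalg.norm(da) * np.linalg.norm(db) + 1e-12)
            if sim > threshold:  # assumed edge rule: similar textual changes
                g.add_edge(a, b)
    # Peripheral (lowest-centrality) nodes: candidates for the stable portfolio.
    centrality = nx.degree_centrality(g)
    return sorted(tickers, key=centrality.get)
```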

    Anomaly detection in laser-guided vehicles' batteries: a case study

    Detecting anomalous data within time series is a highly relevant task in pattern recognition and machine learning, with applications ranging from disease prevention in medicine (e.g., detecting early alterations of health status before they can clearly be classified as illness) to the monitoring of industrial plants. Regarding the latter application, detecting anomalies in an industrial plant's status firstly prevents serious damage that would require a long interruption of the production process. Secondly, it permits optimal scheduling of maintenance interventions by limiting them to situations of actual need, whereas maintenance typically follows a fixed prudential schedule according to which components are replaced well before the end of their expected lifetime. This paper describes a case study on monitoring the status of Laser-Guided Vehicle (LGV) batteries, carried out as our contribution to project SUPER (Supercomputing Unified Platform, Emilia Romagna), which aims at establishing and demonstrating a regional high-performance computing platform that will represent the main Italian supercomputing environment in terms of both computing power and data volume.
    Comment: This paper reports on research carried out as a collaboration between the Department of Engineering and Architecture of the University of Parma and Elettric80 spa within project SUPER (Supercomputing Unified Platform, Emilia Romagna).
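
    The paper's actual detector for LGV battery telemetry is not reproduced in the abstract. As a deliberately simple baseline illustrating the task, here is a rolling z-score anomaly flagger over a univariate signal (e.g., battery voltage); all names and parameters are hypothetical.

```python
import numpy as np

def rolling_anomalies(signal, window=100, z_thresh=4.0):
    """Flag indices whose value deviates from the trailing rolling mean by more
    than z_thresh trailing standard deviations. A naive baseline, not the
    detector used in the paper."""
    anomalies = []
    for t in range(window, len(signal)):
        past = signal[t - window:t]
        mu, sigma = past.mean(), past.std()
        if sigma > 0 and abs(signal[t] - mu) > z_thresh * sigma:
            anomalies.append(t)
    return anomalies

# Usage sketch (hypothetical data file):
# voltages = np.loadtxt("battery_voltage.csv")
# print(rolling_anomalies(voltages))
```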